Key Findings
This analysis draws on 2,046 CalFresh applications submitted online in San Diego County. It includes applicant details, online activity, and final approval outcomes — but does not capture actions taken outside the platform, such as mailed documents or phone interviews.
1. Factors Associated With Approval
To identify what drives approval, I fit a logistic regression model using variables selected for their program relevance, observed user behavior, and patterns found in exploratory analysis. The goal was not just to predict outcomes, but to understand which steps in the process matter most — and where applicants might drop off.
Key predictors included income, document uploads, and interview completion. For interpretability, income was scaled in $500 increments, so each one-unit change corresponds to a $500 difference. Interview completion was recoded so that applicants with missing responses were retained as their own category, allowing the model to capture meaningful differences in follow-through.
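The preprocessing described above can be sketched roughly as follows; the column names and category labels are illustrative placeholders, not the actual dataset schema.

```python
# Sketch of the feature preparation described above. Field names
# ("monthly_income", "interview_completed", etc.) are illustrative.

def prepare_features(app):
    """Map one raw application record to model-ready predictors."""
    # Income scaled to $500 increments so a one-unit change in the
    # coefficient corresponds to a $500 difference in income.
    income_scaled = app["monthly_income"] / 500.0

    # Interview completion recoded to three levels so applicants with
    # a missing response are retained rather than dropped.
    raw = app.get("interview_completed")  # True, False, or None
    if raw is True:
        interview = "confirmed"
    elif raw is False:
        interview = "not_confirmed"
    else:
        interview = "no_response"

    return {
        "income_500s": income_scaled,
        "interview_status": interview,
        "uploaded_docs": 1 if app.get("documents_uploaded") else 0,
    }

# Example record
features = prepare_features(
    {"monthly_income": 1250, "interview_completed": None, "documents_uploaded": True}
)
print(features)
```

Keeping missing interview responses as a distinct category avoids silently dropping the applicants whose follow-through the analysis most needs to measure.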
Key Findings:
- Applicants who confirmed completing the interview had a predicted approval rate of 72%, compared to 50% for those who didn’t confirm.
- Uploading documents with the application was associated with a higher chance of approval.
- Higher income was associated with a lower likelihood of approval, even among mostly income-eligible applicants.
- Having children in the household was associated with higher approval odds.
- Housing stability and application time didn’t show strong associations once other factors were considered.
The table below shows model results using odds ratios — a way to estimate how each factor is associated with the odds of approval, holding all others constant. For example, an odds ratio of 1.5 means the odds of approval are 50% higher for that group compared to the baseline.
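To make the odds-ratio arithmetic concrete, here is a small worked example. The numbers are illustrative, not actual model estimates; note that an odds ratio multiplies odds, not probabilities.

```python
import math

# A logistic regression coefficient is a change in log-odds; its
# exponential is the odds ratio. Values here are illustrative.
coef = math.log(1.5)          # a coefficient implying a 1.5 odds ratio
odds_ratio = math.exp(coef)

# Starting from a 50% baseline approval rate (odds = 1.0), a 1.5 odds
# ratio moves the predicted probability to 60%, not 75%:
baseline_prob = 0.50
baseline_odds = baseline_prob / (1 - baseline_prob)  # = 1.0
new_odds = baseline_odds * odds_ratio                # = 1.5
new_prob = new_odds / (1 + new_odds)                 # = 0.60

print(round(odds_ratio, 2), round(new_prob, 2))
```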
The model explained a meaningful amount of variation in approval outcomes (McFadden R² = 0.15) and ranked a randomly chosen approved application above a randomly chosen denied one 76% of the time (AUC = 0.76). These results suggest the model performs well given the limited behavioral data available from the application platform.
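Both fit statistics can be computed directly from the model's predicted probabilities. The sketch below uses a tiny invented sample purely to show the mechanics, not the report's actual data.

```python
import math

# Illustrative outcomes (approved = 1) and predicted probabilities.
y     = [1, 1, 1, 0, 0, 1, 0, 0]
p_hat = [0.9, 0.8, 0.6, 0.4, 0.7, 0.7, 0.5, 0.2]

def log_lik(y, p):
    """Bernoulli log-likelihood of outcomes y under probabilities p."""
    return sum(math.log(pi) if yi == 1 else math.log(1 - pi)
               for yi, pi in zip(y, p))

# McFadden R^2 compares the model to a null model that predicts the
# overall approval rate for everyone.
p_null = sum(y) / len(y)
mcfadden_r2 = 1 - log_lik(y, p_hat) / log_lik(y, [p_null] * len(y))

# AUC: probability a random approved case scores above a random denial
# (ties count as half a win).
pos = [p for yi, p in zip(y, p_hat) if yi == 1]
neg = [p for yi, p in zip(y, p_hat) if yi == 0]
wins = sum((pp > pn) + 0.5 * (pp == pn) for pp in pos for pn in neg)
auc = wins / (len(pos) * len(neg))

print(round(mcfadden_r2, 3), round(auc, 3))
```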
To make the results easier to interpret, the table below shows predicted approval rates for example applicant scenarios, based on the fitted model.
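Scenario predictions of this kind come from plugging example applicant profiles into the fitted equation. The coefficients below are placeholders chosen only to illustrate the mechanics, not the model's actual estimates.

```python
import math

# Placeholder coefficients (log-odds scale), NOT the fitted values.
coefs = {
    "intercept": 0.0,
    "income_500s": -0.15,        # per $500 of income
    "interview_confirmed": 0.95,
    "uploaded_docs": 0.60,
    "has_children": 0.40,
}

def predicted_approval(income_500s, interview, docs, children):
    """Predicted approval probability for one applicant scenario."""
    z = (coefs["intercept"]
         + coefs["income_500s"] * income_500s
         + coefs["interview_confirmed"] * interview
         + coefs["uploaded_docs"] * docs
         + coefs["has_children"] * children)
    return 1 / (1 + math.exp(-z))  # logistic function -> probability

# Two contrasting scenarios with the same income ($1,000 -> 2 units):
engaged = predicted_approval(2, interview=1, docs=1, children=0)
disengaged = predicted_approval(2, interview=0, docs=0, children=0)
print(round(engaged, 2), round(disengaged, 2))
```

Holding income fixed and toggling only the behavioral steps isolates the gap that interview completion and document uploads account for in a scenario table like the one above.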
These results show that relatively small steps — like completing an interview or uploading a document — can meaningfully increase the chance of approval. Many of these steps could be supported through timely nudges, simpler workflows, or user-centered reminders.
2. Potential Improvements
The model points to clear opportunities to improve approval outcomes by supporting applicants at key decision points.
Recommendations:
- Support interview completion: Many eligible applicants do not confirm completing the interview. Providing reminders, real-time scheduling, or alternative follow-up methods could increase follow-through.
- Encourage early document uploads: Uploading documents with the application was strongly associated with approval. Nudging users to upload early — especially those likely to qualify — could reduce denials.
- Address geographic disparities: Approval rates vary significantly by ZIP code. Further analysis could explore whether these reflect staffing, broadband access, or population needs — and help inform place-based outreach strategies.
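A first pass at the geographic comparison behind the last recommendation is a simple per-ZIP approval rate. The records and field names below are illustrative, not the actual data.

```python
from collections import defaultdict

# Illustrative records; real analysis would use the full application set.
applications = [
    {"zip": "92101", "approved": True},
    {"zip": "92101", "approved": False},
    {"zip": "92113", "approved": True},
    {"zip": "92113", "approved": True},
    {"zip": "92113", "approved": False},
]

totals = defaultdict(lambda: [0, 0])  # zip -> [approved count, total]
for app in applications:
    totals[app["zip"]][0] += app["approved"]
    totals[app["zip"]][1] += 1

approval_rate = {z: a / n for z, (a, n) in totals.items()}
print(approval_rate)
```

Rates like these would then be checked against ZIP-level covariates (staffing, broadband access, population needs) before drawing conclusions about the source of the disparity.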
Strengthening these steps would not only increase approval rates, but also improve equity and access for those most in need.